Efficient Algorithms for Function Approximation with Piecewise Linear Sigmoidal Networks
Abstract
This paper presents a computationally efficient algorithm for function approximation with piecewise linear sigmoidal nodes. A one-hidden-layer network is constructed one node at a time using the well-known method of fitting the residual. The task of fitting an individual node is accomplished using a new algorithm that searches for the best fit by solving a sequence of quadratic programming problems. This approach offers significant advantages over derivative-based search algorithms, e.g., backpropagation and its extensions. Unique characteristics of this algorithm include finite-step convergence, a simple stopping criterion, solutions that are independent of initial conditions, good scaling properties, and a robust numerical implementation. Empirical results are included to illustrate these characteristics.
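As a rough sketch of the construction just described, the Python fragment below builds a one-hidden-layer network one node at a time, fitting each new piecewise linear sigmoidal node to the current residual. The ramp nonlinearity, the generic local search used to fit each node, and all names and parameters are illustrative assumptions; the paper's actual node-fitting procedure solves a sequence of quadratic programming problems and is not reproduced here.

import numpy as np
from scipy.optimize import minimize

def ramp(z):
    # Piecewise linear sigmoid: linear on [-1, 1], saturated outside.
    return np.clip(z, -1.0, 1.0)

def fit_node(X, r, n_restarts=10, rng=None):
    # Fit one hidden node a * ramp(X @ w + b) to the residual r by least squares.
    # A generic optimizer with random restarts stands in for the paper's
    # QP-based search (an assumption, not the paper's method).
    rng = np.random.default_rng() if rng is None else rng
    d = X.shape[1]

    def sse(params):
        w, b = params[:d], params[d]
        h = ramp(X @ w + b)
        denom = h @ h
        # The output weight for this node is a closed-form 1-D least-squares fit.
        a = (h @ r) / denom if denom > 0 else 0.0
        return np.sum((r - a * h) ** 2)

    best = None
    for _ in range(n_restarts):
        res = minimize(sse, rng.normal(size=d + 1), method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    w, b = best.x[:d], best.x[d]
    h = ramp(X @ w + b)
    a = (h @ r) / (h @ h) if (h @ h) > 0 else 0.0
    return w, b, a

def build_network(X, y, n_nodes=5):
    # Greedily add piecewise linear sigmoidal nodes by fitting the residual.
    nodes, yhat = [], np.zeros_like(y, dtype=float)
    for _ in range(n_nodes):
        r = y - yhat                        # current residual
        w, b, a = fit_node(X, r)
        nodes.append((w, b, a))
        yhat = yhat + a * ramp(X @ w + b)   # update the one-hidden-layer model
    return nodes, yhat

For a regression sample (X, y), calling nodes, yhat = build_network(X, y, n_nodes=10) would return the fitted node parameters and the network output on the training inputs.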
Similar Articles
Efficient algorithms for function approximation with piecewise linear sigmoidal networks
This paper presents a computationally efficient algorithm for function approximation with piecewise linear sigmoidal nodes. A one hidden layer network is constructed one node at a time using the well-known method of fitting the residual. The task of fitting an individual node is accomplished using a new algorithm that searches for the best fit by solving a sequence of quadratic programming prob...
Some queries on "Comments on 'Approximation capability in C(R̄^n) by multilayer feedforward networks and related problems'"
In the comments letter by Huang et al. (ibid., vol. 9, 1998), the authors claimed that the boundedness of a sigmoidal function sigma(x) is neither a sufficient nor a necessary condition for the validity of the approximation theorem discussed in our original paper (ibid., vol. 6, 1995). In this paper, we show that this claim is incorrect.
Effect of nonlinear transformations on correlation between weighted sums in multilayer perceptrons
Nonlinear transformation is one of the major obstacles to analyzing the properties of multilayer perceptrons. In this letter, we prove that the correlation coefficient between two jointly Gaussian random variables decreases when each of them is transformed under continuous nonlinear transformations, which can be approximated by piecewise linear functions. When the inputs or the weights of a mul...
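A quick numerical check of the stated property is sketched below; tanh stands in for a generic continuous nonlinearity that piecewise linear functions can approximate, and the correlation value 0.8 is an arbitrary choice for illustration.

import numpy as np

rng = np.random.default_rng(0)
rho = 0.8                                    # correlation of the jointly Gaussian pair
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T

r_before = np.corrcoef(x, y)[0, 1]           # close to 0.8
r_after = np.corrcoef(np.tanh(x), np.tanh(y))[0, 1]
print(r_before, r_after)                     # the correlation magnitude shrinks after the transformation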
Error Bounds for Functional Approximation and Estimation Using Mixtures of Experts
We examine some mathematical aspects of learning unknown mappings with the Mixture of Experts Model (MEM). Specifically, we observe that the MEM is at least as powerful as a class of neural networks, in a sense that will be made precise. Upper bounds on the approximation error are established for a wide class of target functions. The general theorem states that inf ‖f − f_n‖_p ≤ c·n^(−r/d) holds uniformly for f...
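Written out in display form, the bound quoted above reads as follows; the interpretation of n as the number of experts, d as the input dimension, and r as the smoothness of the target class follows the setting of the cited paper and is not spelled out in this snippet.

\[
  \inf_{f_n} \, \lVert f - f_n \rVert_p \;\le\; c \, n^{-r/d},
\]

where the infimum is taken over mixture-of-experts models f_n built from n experts, d is the input dimension, and r indexes the smoothness of the target function f.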
Existence and uniqueness results for neural network approximations
Some approximation theoretic questions concerning a certain class of neural networks are considered. The networks considered are single input, single output, single hidden layer, feedforward neural networks with continuous sigmoidal activation functions, no input weights but with hidden layer thresholds and output layer weights. Specifically, questions of existence and uniqueness of best approx...
Journal: IEEE Transactions on Neural Networks
Year of publication: 1998